    Exploring the Design Space of Extra-Linguistic Expression for Robots

    In this paper, we explore the new design space of extra-linguistic cues inspired by graphical tropes used in graphic novels and animation to enhance the expressiveness of social robots. To achieve this, we identified a set of cues that can be used to generate expressions, including smoke/steam/fog, water droplets, and bubbles. We prototyped devices that can generate these fluid expressions for a robot and conducted design sessions where eight designers explored the use and utility of the cues in conveying the robot's internal states in various design scenarios. Our analysis of the 22 designs, the associated design justifications, and the interviews with designers revealed patterns in how each cue was used, how the cues were combined with nonverbal cues, and where the participants drew their inspiration. These findings informed the design of an integrated module called EmoPack, which can be used to augment the expressive capabilities of any robot platform.
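
    A minimal sketch of how such a cue module might be driven in software, assuming a hypothetical Python interface; FluidCue, CueCommand, express, and the state-to-cue mapping are illustrative assumptions, not the EmoPack API:

        # Hypothetical controller mapping internal states to the fluid cues
        # named in the abstract (smoke/steam/fog, water droplets, bubbles).
        from dataclasses import dataclass
        from enum import Enum, auto

        class FluidCue(Enum):
            SMOKE = auto()     # smoke/steam/fog emitter
            DROPLETS = auto()  # water droplet dispenser
            BUBBLES = auto()   # bubble generator

        @dataclass
        class CueCommand:
            cue: FluidCue
            intensity: float   # 0.0 (off) to 1.0 (maximum output)
            duration_s: float  # how long to run the cue

        # Placeholder mapping; a designer would tune states, cues, and values.
        STATE_TO_CUES = {
            "overheated": [CueCommand(FluidCue.SMOKE, 0.9, 3.0)],
            "sad": [CueCommand(FluidCue.DROPLETS, 0.5, 2.0)],
            "playful": [CueCommand(FluidCue.BUBBLES, 0.7, 4.0)],
        }

        def express(state):
            """Return the cue commands for an internal state (empty if none)."""
            return STATE_TO_CUES.get(state, [])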

    Toward Family-Robot Interactions: A Family-Centered Framework in HRI

    As robotic products become more integrated into daily life, there is a greater need to understand authentic, real-world human-robot interactions to inform product design. Across many domestic, educational, and public settings, robots interact not only with individuals and groups of users but also with families, including children, parents, relatives, and even pets. However, products developed to date and research in human-robot and child-robot interaction have focused on interactions with their primary users, neglecting the complex and multifaceted interactions between family members and with the robot. There is a significant gap in knowledge, methods, and theories for how to design robots to support these interactions. To inform the design of robots that can support and enhance family life, this paper provides (1) a narrative review exemplifying the research gap and opportunities for family-robot interactions and (2) an actionable family-centered framework for research and practice in human-robot and child-robot interaction.

    Sprout: Designing Expressivity for Robots Using Fiber-Embedded Actuators

    In this paper, we explore how techniques from soft robotics can help create a new form of robot expression. We present Sprout, a soft expressive robot that conveys its internal states by changing its body shape. Sprout can extend, bend, twist, and expand using fiber-embedded actuators integrated into its construction. These deformations enable Sprout to express its internal states, for example, by expanding to express anger and bending its body sideways to express curiosity. Through two user studies, we investigated how users interpreted Sprout's expressions, their perceptions of Sprout, and their expectations for future iterations of Sprout's design. We argue that the use of soft actuators opens a novel design space for robot expressions to convey internal states, emotions, and intent.
    Comment: 10 pages, 5 figures
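
    A minimal sketch of how Sprout-style shape expressions might be parameterized, assuming a hypothetical software interface; the deformation fields follow the abstract (extend, bend, twist, expand), while the names and magnitudes are illustrative assumptions:

        # Hypothetical parameterization of body deformations for expression.
        from dataclasses import dataclass

        @dataclass
        class Deformation:
            extend: float = 0.0  # normalized elongation, 0..1
            bend: float = 0.0    # signed sideways bend, -1..1
            twist: float = 0.0   # signed axial twist, -1..1
            expand: float = 0.0  # normalized radial expansion, 0..1

        # Example mappings drawn from the abstract: expansion for anger,
        # a sideways bend for curiosity. Values are placeholders.
        EXPRESSIONS = {
            "anger": Deformation(expand=1.0),
            "curiosity": Deformation(bend=0.8),
        }

        def pose_for(emotion):
            """Look up the deformation for an emotion (neutral if unknown)."""
            return EXPRESSIONS.get(emotion, Deformation())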

    Factors that Affect Personalization of Robots for Older Adults

    We introduce a taxonomy of important factors to consider when designing interactions with an assistive robot in a senior living facility. These factors are derived from our reflection on two field studies and are grouped into the following high-level categories: primary user (residents), care partners, robot, facility, and external circumstances. We outline how multiple factors in these categories impact different aspects of personalization, such as adjusting interactions based on the unique needs of a resident or modifying alerts about the robot's status for different care partners. This preliminary taxonomy serves as a framework for considering how to deploy personalized assistive robots in the complex caregiving ecosystem.
    Comment: Presented at the CONCATENATE Workshop at HRI 2023 in Stockholm, Sweden
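
    A minimal sketch encoding the taxonomy's five high-level categories as a data structure, for illustration only; the example factors listed under each category are assumptions, not the paper's actual factor set:

        # The five categories come from the abstract; factors are placeholders.
        PERSONALIZATION_TAXONOMY = {
            "primary_user": ["abilities", "preferences", "daily routines"],
            "care_partners": ["role", "alert preferences"],
            "robot": ["capabilities", "status reporting"],
            "facility": ["layout", "schedules", "policies"],
            "external_circumstances": ["events", "regulations"],
        }

        def factors_for(category):
            """Return the example factors filed under a taxonomy category."""
            return PERSONALIZATION_TAXONOMY.get(category, [])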

    Understanding Large-Language Model (LLM)-powered Human-Robot Interaction

    Large-language models (LLMs) hold significant promise for improving human-robot interaction, offering advanced conversational skills and versatility in managing diverse, open-ended user requests across tasks and domains. Despite this potential, very little is known about the distinctive design requirements for utilizing LLMs in robots, which may differ from those of text and voice interaction and vary by task and context. To better understand these requirements, we conducted a user study (n = 32) comparing an LLM-powered social robot against text- and voice-based agents, analyzing task-based requirements in conversational tasks, including choose, generate, execute, and negotiate. Our findings show that LLM-powered robots elevate expectations for sophisticated non-verbal cues and excel in connection-building and deliberation, but fall short in logical communication and may induce anxiety. We provide design implications both for robots integrating LLMs and for fine-tuning LLMs for use with robots.
    Comment: 10 pages, 4 figures. Callie Y. Kim and Christine P. Lee contributed equally to this work. To be published in Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI '24), March 11-14, 2024, Boulder, CO, USA
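
    A minimal sketch of the kind of dialogue loop such a system implies, assuming a hypothetical query_llm callable and a placeholder cue policy; the four task types come from the abstract, everything else is an illustrative assumption:

        # Hypothetical loop pairing an LLM reply with a non-verbal cue,
        # reflecting the finding that LLM-powered robots raise expectations
        # for sophisticated non-verbal behavior.
        TASK_TYPES = ("choose", "generate", "execute", "negotiate")

        def respond(user_utterance, task_type, query_llm):
            """Query the language model and pair the reply with a gesture cue."""
            assert task_type in TASK_TYPES
            prompt = f"Task type: {task_type}\nUser: {user_utterance}\nRobot:"
            reply = query_llm(prompt)
            # Placeholder policy: deliberative tasks get a nod, others a gaze.
            cue = "nod" if task_type in ("choose", "negotiate") else "gaze"
            return {"speech": reply, "nonverbal_cue": cue}

        # Usage with a stub standing in for a real LLM:
        print(respond("Which mug should I bring?", "choose", lambda p: "The red one."))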

    Making Informed Decisions: Supporting Cobot Integration Considering Business and Worker Preferences

    Robots are ubiquitous in small-to-large-scale manufacturing. While collaborative robots (cobots) have significant potential in these settings due to their flexibility and ease of use, proper integration is critical to realizing their full potential. Specifically, cobots need to be integrated in ways that utilize their strengths, improve manufacturing performance, and facilitate use in concert with human workers. Effective integration requires careful consideration and the combined knowledge of roboticists, manufacturing engineers, and business administrators. We propose an approach involving the stages of planning, analysis, development, and presentation to inform manufacturers about cobot integration within their facilities prior to the integration process. We contextualize our approach in a case study with a small-to-medium enterprise (SME) collaborator and discuss the insights learned.
    Comment: 9 pages, 9 figures. To be published in Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI '24)
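
    A minimal sketch of the four-stage process as a pipeline, purely illustrative; the stage names come from the abstract, and the artifact bookkeeping is an assumption, not the authors' method:

        # Walk the stages, carrying facility data and preferences forward.
        STAGES = ("planning", "analysis", "development", "presentation")

        def run_integration_study(facility, preferences):
            """Produce one placeholder artifact per stage of the process."""
            artifacts = {}
            for stage in STAGES:
                # Each stage would really yield plans, analyses, prototypes,
                # or presentation material informed by these inputs.
                artifacts[stage] = {"facility": facility["name"],
                                    "preferences": preferences,
                                    "output": f"{stage} report"}
            return artifacts

        print(run_integration_study({"name": "example SME plant"},
                                    {"worker": "minimize reach"}))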

    Periscope: A Robotic Camera System to Support Remote Physical Collaboration

    We investigate how robotic camera systems can offer new capabilities to computer-supported cooperative work through the design, development, and evaluation of a prototype system called Periscope. With Periscope, a local worker completes manipulation tasks with guidance from a remote helper who observes the workspace through a camera mounted on a semi-autonomous robotic arm co-located with the worker. Our key insight is that the helper, the worker, and the robot should all share responsibility for the camera view, an approach we call shared camera control. Using this approach, we present a set of modes that distribute control of the camera between the human collaborators and the autonomous robot depending on task needs. We demonstrate the system's utility and the promise of shared camera control through a preliminary study in which 12 dyads collaboratively worked on assembly tasks. Finally, we discuss design and research implications of our work for future robotic camera systems that facilitate remote collaboration.
    Comment: This is a pre-print of the article accepted for publication in PACM HCI and will be presented at CSCW 2023
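
    A minimal sketch of shared camera control expressed as a set of modes assigning camera authority to one party at a time; the mode names are assumptions for illustration, not Periscope's actual mode set:

        # Each mode hands the camera view to one party, depending on task needs.
        from enum import Enum

        class CameraMode(Enum):
            HELPER_DRIVEN = "helper"  # remote helper steers the view
            WORKER_DRIVEN = "worker"  # local worker repositions the camera
            AUTONOMOUS = "robot"      # robot tracks the task on its own

        def controller_for(mode):
            """Return which party currently commands the camera view."""
            return mode.value

        # A task step might hand control to the robot for hands-free tracking:
        print(controller_for(CameraMode.AUTONOMOUS))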